Estimating Information-Theoretic Quantities
Abstract
Definition

Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units, which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are in common use in neuroscience (see entry "Summary of Information-Theoretic Quantities"). Estimating these quantities in an accurate and unbiased way from real neurophysiological data frequently presents challenges, which are explained in this entry.
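The estimation challenge mentioned above can be illustrated with the simplest case: the maximum-likelihood ("plug-in") entropy estimate, which is systematically biased downward on finite samples. The sketch below, with hypothetical function names, shows the plug-in estimator alongside the first-order Miller-Madow bias correction; it is a minimal illustration, not the full range of estimators used in practice.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Maximum-likelihood ('plug-in') entropy estimate in bits.

    Biased downward for finite samples: unobserved or under-sampled
    symbols make the empirical distribution look less uniform than
    the true one.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the first-order Miller-Madow correction:
    add (K - 1) / (2 n ln 2) bits, where K is the number of symbols
    actually observed in the n samples."""
    n = len(samples)
    k_observed = len(set(samples))
    return plugin_entropy(samples) + (k_observed - 1) / (2 * n * math.log(2))

# A small sample from a fair 4-sided die: the true entropy is 2 bits,
# but the plug-in estimate from only 8 trials falls short of it.
sample = [0, 1, 2, 3, 0, 1, 0, 2]
print(plugin_entropy(sample))        # below 2 bits
print(miller_madow_entropy(sample))  # correction pushes it back up
```

With many samples the bias vanishes (the estimator is consistent); the difficulty in neurophysiology is that trial counts are often small relative to the response space.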
Similar Resources
Estimating Functions of Distributions Defined over Spaces of Unknown Size
We consider Bayesian estimation of information-theoretic quantities from data, using a Dirichlet prior. Acknowledging the uncertainty of the event space size m and the Dirichlet prior’s concentration parameter c, we treat both as random variables set by a hyperprior. We show that the associated hyperprior, P (c,m), obeys a simple “Irrelevance of Unseen Variables” (IUV) desideratum iff P (c,m) =...
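A much-simplified sketch of the Dirichlet-prior idea in the abstract above: for fixed, known values of the event-space size m and concentration c (rather than treating them as random variables under a hyperprior), one can smooth the observed counts with the symmetric Dirichlet pseudocounts and plug the posterior-mean probabilities into the entropy formula. The function name and the restriction of events to integers 0..m-1 are assumptions for illustration; this plugs in the posterior-mean distribution rather than computing the full posterior mean of the entropy itself.

```python
import math
from collections import Counter

def dirichlet_smoothed_entropy(samples, m, c=1.0):
    """Entropy (bits) of the posterior-mean distribution under a
    symmetric Dirichlet prior with total concentration c over an
    event space of assumed size m (events are integers 0..m-1).

    Simplified sketch: m and c are fixed here, whereas the abstract
    above treats both as random variables under a hyperprior P(c, m).
    """
    n = len(samples)
    counts = Counter(samples)
    alpha = c / m  # per-bin pseudocount of the symmetric Dirichlet
    # Posterior-mean probability of each bin, including unseen ones.
    probs = [(counts.get(i, 0) + alpha) / (n + c) for i in range(m)]
    return -sum(p * math.log2(p) for p in probs if p > 0)

# With no data at all, the estimate is just the prior's uniform
# entropy over the m bins.
print(dirichlet_smoothed_entropy([], m=8))
print(dirichlet_smoothed_entropy([0, 1, 0, 2], m=8))
```

Unlike the raw plug-in estimator, the smoothed version assigns nonzero probability to unseen events, which is exactly why the assumed size m matters so much when it is unknown.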
On Lower Bounds for Statistical Learning Theory
In recent years, tools from information theory have played an increasingly prevalent role in statistical machine learning. In addition to developing efficient, computationally feasible algorithms for analyzing complex datasets, it is of theoretical importance to determine whether such algorithms are “optimal” in the sense that no other algorithm can lead to smaller statistical error. This paper...
Information Theoretic Tools for Social Media
Information theory provides a powerful set of tools for discovering relationships among variables with minimal assumptions. Social media platforms provide a rich source of information that can include temporal, spatial, textual, and network information. What are the interesting information theoretic measures for social media and how can we estimate these quantities? I will discuss how measures ...
Information-Theoretic Methods for Identifying Relationships among Climate Variables
* Supported by NASA AIST-QRS-07-0001

Information-theoretic quantities, such as entropy, are used to quantify the amount of information a given variable provides. Entropies can be used together to compute the mutual information, which quantifies the amount of information two variables share. However, accurately estimating these quantities from data is extremely challenging. We have devel...
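The relationship this abstract describes, mutual information computed from entropies of two variables, can be sketched with the standard plug-in estimate I(X;Y) = Σ p(x,y) log2[ p(x,y) / (p(x) p(y)) ] over paired samples. The function name is assumed for illustration; like the entropy plug-in, this estimate is biased on small samples, which is the estimation difficulty the abstract refers to.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in mutual information I(X;Y) in bits from a list of
    (x, y) sample pairs, using empirical joint and marginal counts."""
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), k in joint.items():
        pxy = k / n
        # p(x,y) / (p(x) p(y)) written with counts: k*n / (n_x * n_y)
        mi += pxy * math.log2(pxy * n * n / (px[x] * py[y]))
    return mi

# Perfectly correlated binary variables share 1 bit of information;
# independent ones share none.
print(mutual_information([(0, 0), (1, 1), (0, 0), (1, 1)]))  # 1.0
print(mutual_information([(0, 0), (0, 1), (1, 0), (1, 1)]))  # 0.0
```

On small samples this estimator tends to report spurious shared information between independent variables, so bias correction or shuffling controls are needed in practice.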
A Non-parametric Maximum Entropy Clustering
Clustering is a fundamental tool for exploratory data analysis. Information theoretic clustering is based on the optimization of information theoretic quantities such as entropy and mutual information. Recently, since these quantities can be estimated in a non-parametric manner, non-parametric information theoretic clustering has gained much attention. Assuming the dataset is sampled from a certain cl...